
Korean Title: 사전 훈련된 모델을 활용한 버그 보고서 요약
English Title: Bug Report Summarization using Pretrained Models
Author(s): 무흐따르 사말 (Samal Mukhtar), 이선아 (Seonah Lee)
Citation: Vol. 49, No. 1, pp. 279-281 (2022.06)
Korean Abstract:
English Abstract:
To the best of our knowledge, previous work on bug report summarization has not utilized pretrained models. Our work applies the pretrained models BART and T5 to two popular bug report benchmark datasets. Considering the difficulty of utilizing abstractive patterns for bug summarization, we propose an approach that improves the summarization accuracy of T5 by more than a factor of two. Our approach helps preserve details in the input data that can be missed by the T5 architecture. To evaluate the accuracy of our approach, we compared its results with those of the state-of-the-art approach BugSum. Our experiments show that our proposed approach based on BART and T5 outperforms the existing method BugSum on two public datasets. Our work shows the potential for obtaining higher bug summarization accuracy with pretrained models.
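The abstract reports accuracy comparisons against BugSum but does not name the metric; summarization quality is conventionally scored with ROUGE. As an illustration only (the paper's actual evaluation setup is not given here), a minimal ROUGE-1 F1 sketch in pure Python, with hypothetical example strings:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between a candidate summary and a reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram match count
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Hypothetical bug-report summary pair (illustrative, not from the paper's datasets)
gold = "crash when saving large files on windows"
pred = "app crashes when saving large files"
score = rouge1_f1(pred, gold)
```

In practice, evaluations of this kind typically use an established ROUGE implementation and also report ROUGE-2 and ROUGE-L; this sketch only shows the unigram case.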
Keywords: